Weakly Convex Optimization over Stiefel Manifold Using Riemannian Subgradient-Type Methods

Authors

Abstract

We consider a class of nonsmooth optimization problems over the Stiefel manifold, in which the objective function is weakly convex in the ambient Euclidean space. Such problems are ubiquitous in engineering applications but still largely unexplored. We present a family of Riemannian subgradient-type methods---namely Riemannian subgradient, incremental subgradient, and stochastic subgradient methods---to solve these problems and show that they all have an iteration complexity of $\mathcal{O}(\varepsilon^{-4})$ for driving a natural stationarity measure below $\varepsilon$. In addition, we establish the local linear convergence of the Riemannian subgradient and incremental subgradient methods when the problem at hand further satisfies a sharpness property and the algorithms are properly initialized and use geometrically diminishing stepsizes. To the best of our knowledge, these are the first convergence guarantees for using Riemannian subgradient-type methods to optimize nonconvex nonsmooth functions over the Stiefel manifold. The fundamental ingredient in the proof of the aforementioned convergence results is a new Riemannian subgradient inequality for restrictions of weakly convex functions on the Stiefel manifold, which could be of independent interest. We also show that our convergence results can be extended to handle a class of compact embedded submanifolds of the Euclidean space. Finally, we discuss the sharpness properties of various formulations of the robust subspace recovery and orthogonal dictionary learning problems and demonstrate the convergence performance of the algorithms on both problems via numerical simulations.
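For intuition, here is a minimal Python/NumPy sketch of the Riemannian subgradient iteration described above. It assumes a QR-based retraction, the standard tangent-space projection for the Stiefel manifold, and a geometrically diminishing stepsize $\gamma_k = \gamma_0 \beta^k$; the toy objective $f(X) = \|AX\|_1$ and all parameter values are illustrative choices, not the paper's exact setup.

```python
import numpy as np

def tangent_project(X, G):
    """Project a Euclidean (sub)gradient G onto the tangent space of the
    Stiefel manifold St(n, p) at X (projection w.r.t. the embedded metric)."""
    S = X.T @ G
    return G - X @ (S + S.T) / 2

def qr_retract(X, V):
    """Retract X + V back onto the Stiefel manifold via a thin QR factorization."""
    Q, R = np.linalg.qr(X + V)
    return Q * np.sign(np.sign(np.diag(R)) + 0.5)  # enforce a positive diagonal of R

def riemannian_subgradient(subgrad, X0, gamma0=0.1, beta=0.97, iters=300):
    """Riemannian subgradient method with geometrically diminishing stepsizes."""
    X = X0
    for k in range(iters):
        G = subgrad(X)                        # Euclidean subgradient at X
        xi = tangent_project(X, G)            # Riemannian subgradient
        X = qr_retract(X, -gamma0 * beta**k * xi)
    return X

# Toy usage: minimize the (weakly convex) objective f(X) = ||A X||_1 over St(n, p).
rng = np.random.default_rng(0)
n, p = 30, 5
A = rng.standard_normal((200, n))
subgrad = lambda X: A.T @ np.sign(A @ X)      # a Euclidean subgradient of f
X0, _ = np.linalg.qr(rng.standard_normal((n, p)))
X_hat = riemannian_subgradient(subgrad, X0)
print(np.linalg.norm(X_hat.T @ X_hat - np.eye(p)))  # feasibility check, ~1e-15
```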


Similar Articles

"Efficient" Subgradient Methods for General Convex Optimization

A subgradient method is presented for solving general convex optimization problems, the main requirement being that a strictly-feasible point is known. A feasible sequence of iterates is generated, which converges to within a user-specified error of optimality. Feasibility is maintained with a linesearch at each iteration, avoiding the need for orthogonal projections onto the feasible region (an ...


Quadratic programs over the Stiefel manifold

We characterize the optimal solution of a quadratic program over the Stiefel manifold with an objective function in trace formulation. The result is applied to relaxations of HQAP and MTLS. Finally, we show that strong duality holds for the Lagrangian dual, provided some redundant constraints are added to the primal program.
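For orientation, a quadratic program over the Stiefel manifold with a trace-form objective is commonly written as $\min_{X \in \mathbb{R}^{n \times p}} \operatorname{tr}(X^{\top} A X B) + 2\operatorname{tr}(C^{\top} X)$ subject to $X^{\top} X = I_p$, where $A \in \mathbb{R}^{n \times n}$ and $B \in \mathbb{R}^{p \times p}$ are symmetric and $C \in \mathbb{R}^{n \times p}$. This generic form is given only for illustration and may differ in detail from the formulation analyzed in the cited paper.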


A Riemannian conjugate gradient method for optimization on the Stiefel manifold

In this paper we propose a new Riemannian conjugate gradient method for optimization on the Stiefel manifold. We introduce two novel vector transports associated with the retraction constructed by the Cayley transform. Both of them satisfy the Ring-Wirth nonexpansive condition, which is fundamental for convergence analysis of Riemannian conjugate gradient methods, and one of them is also isomet...
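As a rough illustration of the Cayley-transform retraction mentioned above, the following Python/NumPy sketch builds the skew-symmetric matrix $W = GX^{\top} - XG^{\top}$ from a Euclidean gradient $G$ and moves along the Cayley curve, which stays exactly on the manifold. The function name and parameters are invented for this example; the cited paper's vector transports and conjugate gradient updates are not reproduced here.

```python
import numpy as np

def cayley_retract(X, G, tau):
    """One step along the Cayley curve Y(tau) = (I + tau/2 W)^{-1} (I - tau/2 W) X,
    where W = G X^T - X G^T is skew-symmetric, so the multiplier is orthogonal
    and Y(tau) remains on the Stiefel manifold for every tau."""
    n = X.shape[0]
    W = G @ X.T - X @ G.T
    I = np.eye(n)
    return np.linalg.solve(I + (tau / 2) * W, (I - (tau / 2) * W) @ X)

# Feasibility check on random data.
rng = np.random.default_rng(1)
n, p = 8, 3
X, _ = np.linalg.qr(rng.standard_normal((n, p)))
G = rng.standard_normal((n, p))
Y = cayley_retract(X, G, tau=0.1)
print(np.linalg.norm(Y.T @ Y - np.eye(p)))  # ~1e-15
```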


Randomized Block Subgradient Methods for Convex Nonsmooth and Stochastic Optimization

Block coordinate descent methods and stochastic subgradient methods have been extensively studied in optimization and machine learning. By combining randomized block sampling with stochastic subgradient methods based on dual averaging ([22, 36]), we present stochastic block dual averaging (SBDA)—a novel class of block subgradient methods for convex nonsmooth and stochastic optimization. SBDA re...


Mirror descent and nonlinear projected subgradient methods for convex optimization

The mirror descent algorithm (MDA) was introduced by Nemirovsky and Yudin for solving convex optimization problems. This method exhibits an efficiency estimate that is mildly dependent on the dimension of the decision variables, and is thus suitable for solving very large-scale optimization problems. We present a new derivation and analysis of this algorithm. We show that the MDA can be viewed as a nonline...
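As a concrete instance, here is a minimal Python/NumPy sketch of mirror descent with the negative-entropy mirror map on the probability simplex, for which the update has a closed form (an exponentiated-gradient step followed by renormalization). This standard instantiation is shown only for illustration and is not necessarily the setting analyzed in the cited paper; the stepsize and iteration count are arbitrary.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, step=0.1, iters=200):
    """Mirror descent with the negative-entropy mirror map on the simplex:
    a multiplicative-weights step followed by renormalization, which is the
    closed-form Bregman projection for this mirror map."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-step * grad(x))  # gradient step in the dual space
        x = x / x.sum()                  # Bregman projection back onto the simplex
    return x

# Toy usage: minimize the linear objective f(x) = <c, x> over the simplex.
c = np.array([3.0, 1.0, 2.0])
x_hat = mirror_descent_simplex(lambda x: c, np.ones(3) / 3)
print(x_hat)  # mass concentrates on the coordinate with the smallest cost
```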



Journal

Journal title: SIAM Journal on Optimization

Year: 2021

ISSN: 1095-7189, 1052-6234

DOI: https://doi.org/10.1137/20m1321000